Data Parameters: A New Family of Parameters for Learning a Differentiable Curriculum

Neural Information Processing Systems

Recent works have shown that learning from easier instances first can help deep neural networks (DNNs) generalize better. However, knowing which data to present during different stages of training is a challenging problem. In this work, we address this problem by introducing data parameters. More specifically, we equip each sample and class in a dataset with a learnable parameter (data parameters), which governs their importance in the learning process. During training, at each iteration, as we update the model parameters, we also update the data parameters. These updates are done by gradient descent and do not require hand-crafted rules or design. When applied to the image classification task on the CIFAR10, CIFAR100, WebVision and ImageNet datasets, and the object detection task on the KITTI dataset, learning a dynamic curriculum via data parameters leads to consistent gains, without any increase in model complexity or training time. When applied to a noisy dataset, the proposed method learns to learn from clean images and improves over the state-of-the-art methods by 14%. To the best of our knowledge, our work is the first curriculum learning method to show gains on large-scale image classification and detection tasks.
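The abstract describes equipping each sample with a learnable parameter that is updated by gradient descent alongside the model weights. A minimal PyTorch sketch of this idea is below; it assumes the data parameter acts as a per-sample temperature that rescales the logits before the loss (the variable names and the tiny linear model are illustrative, not the authors' code).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative setup: a tiny classifier and one learnable "data parameter"
# (temperature) per training sample. Parameterized in log-space so the
# temperature stays positive.
num_samples, num_classes, feat_dim = 8, 3, 5
model = nn.Linear(feat_dim, num_classes)
log_temp = nn.Parameter(torch.zeros(num_samples))

# Separate optimizers, so model and data parameters can use different
# learning rates, as a curriculum method plausibly would.
opt_model = torch.optim.SGD(model.parameters(), lr=0.1)
opt_data = torch.optim.SGD([log_temp], lr=0.1)

x = torch.randn(num_samples, feat_dim)
y = torch.randint(0, num_classes, (num_samples,))
idx = torch.arange(num_samples)  # dataset indices of the samples in this batch

for _ in range(5):
    logits = model(x)
    temp = log_temp[idx].exp().unsqueeze(1)   # per-sample temperature > 0
    loss = F.cross_entropy(logits / temp, y)  # scaled logits; gradients flow to both
    opt_model.zero_grad()
    opt_data.zero_grad()
    loss.backward()
    opt_model.step()
    opt_data.step()  # model and data parameters are updated at each iteration
```

A large temperature flattens a sample's softmax and shrinks its gradient contribution, so learning the temperatures amounts to learning how much each sample influences training at a given stage, with no hand-crafted schedule.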


Reviews: Data Parameters: A New Family of Parameters for Learning a Differentiable Curriculum

Neural Information Processing Systems

This work proposes an optimization scheme for learning a curriculum over classes or training samples. The importance of each sample/class is reflected by a learnable parameter that is learned by gradient descent simultaneously with the network weights. The proposed scheme is particularly advantageous on noisy data, as demonstrated empirically. All reviewers find their concerns well addressed in the authors' response, and they all find the paper a solid and interesting work.


Data Parameters: A New Family of Parameters for Learning a Differentiable Curriculum

Saxena, Shreyas, Tuzel, Oncel, DeCoste, Dennis

Neural Information Processing Systems
